Consistent Bayesian information criterion based on a mixture prior for possibly high‐dimensional multivariate linear regression models

Authors

Abstract

In the problem of selecting variables in a multivariate linear regression model, we derive new Bayesian information criteria based on a prior mixing a smooth distribution and a delta distribution. Each of them can be interpreted as a fusion of the Akaike information criterion (AIC) and the Bayesian information criterion (BIC). Inheriting their asymptotic properties, our criteria are consistent for variable selection in both the large-sample and the high-dimensional asymptotic frameworks. In numerical simulations, variable selection methods based on our criteria choose the true set of variables with high probability in most cases.
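For context, the classical criteria that the proposed criteria fuse can be written down explicitly. The sketch below is an illustration only, not the paper's mixture-prior criterion: it computes the textbook Gaussian AIC and BIC for candidate predictor subsets in a multivariate linear regression Y = XB + E and returns the subset minimizing the chosen criterion. The function names and the exhaustive-search strategy are our own assumptions for the example.

```python
import numpy as np
from itertools import combinations

def gaussian_ic(Y, X, subset, penalty="bic"):
    """Classical AIC/BIC for the Gaussian model Y = X[:, subset] @ B + E.

    Illustrative textbook criterion only (assumes n > p and n > len(subset));
    this is NOT the mixture-prior criterion proposed in the paper.
    """
    n, p = Y.shape
    Xj = X[:, list(subset)]
    B_hat, *_ = np.linalg.lstsq(Xj, Y, rcond=None)   # least-squares (= ML) estimate of B
    resid = Y - Xj @ B_hat
    Sigma_hat = resid.T @ resid / n                  # ML estimate of the error covariance
    # -2 * maximized log-likelihood, including constants shared by all candidate models
    neg2loglik = n * np.linalg.slogdet(Sigma_hat)[1] + n * p * (1.0 + np.log(2.0 * np.pi))
    k = len(subset) * p                              # number of regression-mean parameters
    pen = 2.0 * k if penalty == "aic" else k * np.log(n)
    return neg2loglik + pen

def select_subset(Y, X, penalty="bic"):
    """Exhaustive search over non-empty predictor subsets (feasible only for small X.shape[1])."""
    best_val, best_subset = np.inf, None
    for r in range(1, X.shape[1] + 1):
        for subset in combinations(range(X.shape[1]), r):
            val = gaussian_ic(Y, X, subset, penalty)
            if val < best_val:
                best_val, best_subset = val, subset
    return best_subset, best_val
```

On simulated data with a few active predictors, calling select_subset(Y, X, "bic") typically recovers the true set when the sample size is large. The paper's contribution is a criterion whose penalty, derived from a prior mixing a smooth distribution and a point mass, keeps this selection consistency in both the large-sample and the high-dimensional asymptotic regimes.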


Similar articles

General Hyperplane Prior Distributions Based on Geometric Invariances for Bayesian Multivariate Linear Regression

Based on geometric invariance properties, we derive an explicit prior distribution for the parameters of multivariate linear regression problems in the absence of further prior information. The problem is formulated as a rotationally-invariant distribution of L-dimensional hyperplanes in N dimensions, and the associated system of partial differential equations is solved. The derived prior distri...


Prior Information Based Bayesian Infinite Mixture Model

Unsupervised learning methods have been tremendously successful in extracting knowledge from genomics data generated by high throughput experimental assays. However, analysis of each dataset in isolation without incorporating potentially informative prior knowledge is limiting the utility of such procedures. Here we present a novel probabilistic model and computational algorithm for semi-superv...


Multivariate linear regression with non-normal errors: a solution based on mixture models

In some situations, the distribution of the error terms of a multivariate linear regression model may depart from normality. This problem has been addressed, for example, by specifying a different parametric distribution family for the error terms, such as multivariate skewed and/or heavy-tailed distributions. A new solution is proposed, which is obtained by modelling the error term distributio...


Extending the Akaike Information Criterion to Mixture Regression Models

We examine the problem of jointly selecting the number of components and variables in finite mixture regression models. We find that the Akaike information criterion is unsatisfactory for this purpose because it overestimates the number of components, which in turn results in incorrect variables being retained in the model. Therefore, we derive a new information criterion, the mixture regressio...


A Bayesian information criterion for singular models

We consider approximate Bayesian model choice for model selection problems that involve models whose Fisher-information matrices may fail to be invertible along other competing submodels. Such singular models do not obey the regularity conditions underlying the derivation of Schwarz’s Bayesian information criterion (BIC) and the penalty structure in BIC generally does not reflect the frequentis...



Journal

Journal title: Scandinavian Journal of Statistics

Year: 2022

ISSN: 0303-6898, 1467-9469

DOI: https://doi.org/10.1111/sjos.12617